EXACT CLASSIFICATION WITH TWO-LAYERED PERCEPTRONS

Authors
Abstract


Similar Resources

Memorandum COSOR 92-25: Exact Classification with Two-Layered Perceptrons

We study the capabilities of two-layered perceptrons for exactly classifying a given subset. Both necessary and sufficient conditions are derived for subsets to be exactly classifiable with two-layered perceptrons that use the hard-limiting response function. The necessary conditions can be viewed as generalizations of the linear-separability condition of one-layered perceptrons and confirm the...
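As a rough illustration of the idea (not taken from the paper), the sketch below shows a two-layered perceptron with hard-limiting (Heaviside) units that exactly classifies the XOR subset of {0,1}^2, a set that fails the linear-separability condition and so cannot be classified exactly by a one-layered perceptron. The weights are hand-picked for this example.

```python
import numpy as np

def hard_limit(x):
    # Hard-limiting (Heaviside step) response: 1 if x >= 0, else 0.
    return (x >= 0).astype(int)

def two_layer_perceptron(x, W1, b1, w2, b2):
    # First layer: linear threshold units; second layer: one threshold output unit.
    h = hard_limit(W1 @ x + b1)
    return hard_limit(w2 @ h + b2)

# Hand-picked weights that exactly classify XOR = {(0,1), (1,0)},
# a subset of {0,1}^2 that no one-layered perceptron can separate.
W1 = np.array([[ 1.0,  1.0],    # fires when x1 + x2 >= 1 (at least one input on)
               [-1.0, -1.0]])   # fires when x1 + x2 <= 1 (at most one input on)
b1 = np.array([-1.0, 1.0])
w2 = np.array([1.0, 1.0])       # output fires only when both hidden units fire
b2 = -2.0

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, int(two_layer_perceptron(np.array(x, float), W1, b1, w2, b2)))
# Prints: (0,0)->0, (0,1)->1, (1,0)->1, (1,1)->0
```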


Classification using multi-layered perceptrons

There has been increasing interest in the applicability of neural networks to disparate domains. In this paper, we describe the use of multi-layered perceptrons, a type of neural network topology, for financial classification problems, with promising results. However, backpropagation, the learning algorithm most often used in multi-layered perceptrons, is inherently an inefficient se...
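As a minimal sketch of backpropagation in a multi-layered perceptron (the paper's financial data, architecture, and hyperparameters are not reproduced here; the synthetic task and all settings below are placeholders), gradient descent on a mean-squared-error loss is propagated backwards through one sigmoid hidden layer:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic stand-in for a binary classification task: 2 features, one label.
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

# One hidden layer with 8 sigmoid units, one sigmoid output unit.
W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)
lr = 0.5

for epoch in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: mean-squared-error gradients, propagated layer by layer.
    d_out = (p - y) * p * (1 - p)           # output-layer delta
    d_hid = (d_out @ W2.T) * h * (1 - h)    # hidden-layer delta
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_hid / len(X); b1 -= lr * d_hid.mean(axis=0)

print("training accuracy:", ((p > 0.5) == y).mean())
```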


On the equivalence of two-layered perceptrons with binary neurons

We consider two-layered perceptrons consisting of N binary input units, K binary hidden units, and one binary output unit, in the limit N ≫ K ≥ 1. We prove that the weights of a regular irreducible network are uniquely determined by its input-output map up to some obvious global symmetries. A network is regular if its K weight vectors from the input layer to the K hidden units are linearly...
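The "obvious global symmetries" can be illustrated with a small numerical sketch (assuming ±1 neurons with a sign response; the construction below is not from the paper): permuting the hidden units, or flipping the sign of a hidden unit's incoming weights together with its outgoing weight, leaves the input-output map unchanged.

```python
import numpy as np
from itertools import product

def sign(x):
    # +/-1 binary neurons (sign convention assumed; zero mapped to +1).
    return np.where(x >= 0, 1, -1)

def io_map(W, v, N):
    # Input-output map of a two-layered perceptron: N inputs in {-1, +1},
    # K hidden units with weight vectors W[k], one output unit with weights v.
    return tuple(int(sign(v @ sign(W @ np.array(x)))) for x in product([-1, 1], repeat=N))

rng = np.random.default_rng(1)
N, K = 5, 3
W = rng.normal(size=(K, N))
v = rng.normal(size=K)

# Symmetry 1: permute hidden units (rows of W together with entries of v).
perm = rng.permutation(K)
assert io_map(W, v, N) == io_map(W[perm], v[perm], N)

# Symmetry 2: flip the sign of one hidden unit's weights and of its output weight.
# (Assumes no pre-activation is exactly zero, which holds for generic real weights.)
W2, v2 = W.copy(), v.copy()
W2[0], v2[0] = -W2[0], -v2[0]
assert io_map(W, v, N) == io_map(W2, v2, N)
print("permutation and sign-flip symmetries leave the input-output map unchanged")
```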


Classification of Digital Modulation Schemes Using Multi-layered Perceptrons

Automatic classification of modulation schemes is of interest for both civilian and military applications. This report describes an experiment classifying six modulation schemes using a Multi-Layered Perceptron (MLP) neural network. Six key features were extracted from the signals and used as inputs to the MLP. The approach was similar to that of Azzouz and Nandi [2]. The aim was to see how the...


Generalization and capacity of extensively large two-layered perceptrons.

The generalization ability and storage capacity of a treelike two-layered neural network with a number of hidden units scaling as the input dimension are examined. The mapping from the input to the hidden layer is via Boolean functions; the mapping from the hidden layer to the output is done by a perceptron. The analysis is within the replica framework where an order parameter characterizing the...
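The architecture described can be sketched as follows (a hedged illustration only: the Boolean hidden functions and output weights are drawn at random as placeholders, and the replica analysis itself is not reproduced). Each hidden unit sees its own disjoint block of the input, applies a Boolean function to it, and a perceptron combines the hidden values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Tree-like architecture: K hidden units, each seeing its own disjoint block of
# n inputs, so the input dimension is N = K * n (hidden-layer size scales with N).
K, n = 4, 3
N = K * n

# Each hidden unit applies a Boolean function to its block, represented here as a
# random truth table over the 2^n block patterns, with values in {-1, +1}.
truth_tables = rng.integers(0, 2, size=(K, 2 ** n)) * 2 - 1

def hidden_layer(x):
    # x: array of N inputs in {0, 1}; each block of n bits indexes its truth table.
    blocks = x.reshape(K, n)
    idx = blocks @ (2 ** np.arange(n)[::-1])   # binary block pattern -> table index
    return truth_tables[np.arange(K), idx]

# Output: a simple perceptron (linear threshold unit) over the K hidden values.
w = rng.normal(size=K)

def network(x):
    return 1 if w @ hidden_layer(x) >= 0 else -1

x = rng.integers(0, 2, size=N)
print("input:", x, "-> output:", network(x))
```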



Journal

Journal title: International Journal of Neural Systems

Year: 1992

ISSN: 0129-0657, 1793-6462

DOI: 10.1142/s0129065792000127